Farm Animals and Their Welfare in 2000
Farm animals have been a traditional concern of the modern animal protection movement. In the early 1800s, when the movement emerged as a significant sociopolitical force in the United Kingdom, its first priority was protection of farm animals, with particular emphasis on cattle and horses. Subsequently priorities changed, and throughout most of the 1900s, animal protectionism in Europe and the English-speaking world focused more strongly on the use of animals for scientific research and on the rescue of abandoned or ill-treated companion animals. Today, however, with vigorous public debate over animal agriculture and its effects, farm animals are re-emerging as a major subject of humane concern.
Providing Access to a Data Library: SQL and Full-Text IR Methods of Automatically Generating Web Structure
Social science research in universities has traditionally been supported by a central data archive, holding both the raw data (e.g. the U.S. census) and the wide variety of support materials necessary to identify, understand and manipulate the raw data. Columbia University's Electronic Data Service (EDS) has built an online system using World-Wide-Web technology to offer these support materials, and ultimately the raw data itself, to our research community. We describe our historical motivations, our data format difficulties, our filing systems and our scalable technology solution. Our emphasis is on a new set of software, inter-connected by Web protocols, which is easy to use and is self-maintaining.
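The kind of SQL-driven page generation the abstract describes can be sketched in miniature: a metadata table is queried and the result set is rendered directly into an HTML link list, so the site's structure maintains itself as the table changes. The schema, titles, and URL paths below are all hypothetical, chosen only to illustrate the idea.

```python
import sqlite3
import html

# Hypothetical catalog of datasets; in EDS this would be the archive's metadata store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dataset (id INTEGER PRIMARY KEY, title TEXT, topic TEXT)")
conn.executemany("INSERT INTO dataset (title, topic) VALUES (?, ?)", [
    ("1990 U.S. Census Summary", "census"),
    ("County Business Patterns", "economy"),
])

def topic_page(topic):
    # One SQL query per topic page; the HTML structure is derived, not hand-edited.
    rows = conn.execute(
        "SELECT id, title FROM dataset WHERE topic = ? ORDER BY title", (topic,))
    items = "\n".join(f'<li><a href="/data/{i}">{html.escape(t)}</a></li>'
                      for i, t in rows)
    return f"<ul>\n{items}\n</ul>"
```

Adding a row to the table immediately adds a link to the generated page, which is the "self-maintaining" property the abstract emphasizes.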
Degree-Driven Design of Geometric Algorithms for Point Location, Proximity, and Volume Calculation
Correct implementation of published geometric algorithms is surprisingly difficult. Geometric algorithms are often designed for the Real-RAM, a computational model that provides arbitrary-precision arithmetic operations at unit cost. Actual commodity hardware provides only finite precision and may produce arithmetic errors. While the errors may seem small, if ignored they may cause incorrect branching, which may cause an implementation to reach an undefined state, produce erroneous output, or crash. In 1999, Liotta, Preparata and Tamassia proposed that in addition to considering the resources of time and space, an algorithm designer should also consider the arithmetic precision necessary to guarantee a correct implementation. They called this design technique degree-driven algorithm design. Designers who consider the time, space, and precision for a problem up-front arrive at new solutions, gain further insight, and find simpler representations. In this thesis, I show that degree-driven design supports the development of new and robust geometric algorithms. I demonstrate this claim via several new algorithms. For n point sites on a U×U grid I consider three problems. First, I show how to compute the nearest neighbor transform in O(U^2) expected time, O(U^2) space, and double precision. Second, I show how to create a data structure in O(n log Un) expected time, O(n) expected space, and triple precision that supports O(log n)-time, double-precision post-office queries. Third, I show how to compute the Gabriel graph in O(n^2) time, O(n^2) space, and double precision. For computing volumes of CSG models, I describe a framework that uses a minimal set of predicates requiring at most five-fold precision. The framework is over 500× faster and two orders of magnitude more accurate than a Monte Carlo volume calculation algorithm.
Doctor of Philosophy
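The incorrect branching the abstract warns about can be seen with the classic orientation predicate. In the example below (an illustrative sketch, not one of the thesis's algorithms), every coordinate is exactly representable in double precision, yet the two products in the determinant each round to the same double, so the float predicate reports "collinear" while exact rational arithmetic shows the true determinant is +4.

```python
from fractions import Fraction

def orient_float(a, b, c):
    # Sign of the cross product (b - a) x (c - a), evaluated in double precision.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def orient_exact(a, b, c):
    # The same determinant, evaluated with exact rational arithmetic.
    ax, ay = Fraction(a[0]), Fraction(a[1])
    bx, by = Fraction(b[0]), Fraction(b[1])
    cx, cy = Fraction(c[0]), Fraction(c[1])
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)

# All coordinates below are exact doubles (the spacing of doubles near 1e16 is 2).
# Exactly: 1e16 * 1e16 - (1e16 - 2) * (1e16 + 2) = 10^32 - (10^32 - 4) = 4.
# In doubles, both products round to the same value, so the difference is 0.0.
a, b, c = (0.0, 0.0), (1e16, 1e16 - 2), (1e16 + 2, 1e16)
```

Branching on `orient_float` here would classify a strictly left turn as collinear, which is exactly the kind of silent error degree-driven design is meant to rule out.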
DPLA Exchange and SimplyE, an open platform for e-content services
This presentation describes a pilot program spearheaded by the Digital Public Library of America (DPLA) to test a new model for a library-owned and library-centered ebook marketplace for popular ebooks, together with free public domain and openly-licensed ebooks. The goal of the program is to demonstrate how DPLA can help libraries maximize access to ebooks for their patrons. For the pilot, DPLA sought out a mix of library types including a state library, a consortium, and both a large public library and one serving smaller and rural populations. This diverse group of pilot libraries comprises: Alameda County Library (CA); Carnegie Library of Pittsburgh (PA); Connecticut State Library (CT); Califa Library Group (CA, KS); St. Mary’s County Library (MD); and Yavapai Library Network (AZ).
A Faithful Discretization of the Augmented Persistent Homology Transform
The persistent homology transform (PHT) represents a shape with a multiset of persistence diagrams parameterized by the sphere of directions in the ambient space. In this work, we describe a finite set of diagrams that discretizes the PHT such that it faithfully represents the underlying shape. We provide a discretization whose size is exponential in the dimension of the shape. Furthermore, we provide an output-sensitive algorithm; that is, the algorithm reports the discretization in time proportional to the size of the discretization. Finally, our approach relies only on knowing the heights and dimensions of topological events, meaning that it can be adapted to provide discretizations of other dimension-returning topological transforms, including the Betti curve transform.
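As a concrete instance of a single member of such a family of diagrams: for a graph embedded in the plane, the 0-dimensional persistence diagram in a direction v comes from the sublevel filtration of the height function x ↦ ⟨x, v⟩, and can be computed with a union-find sweep. This is a standard illustrative sketch of one diagram per direction, not the discretization algorithm of the paper.

```python
import math

def diagram_0d(points, edges, v):
    """0-dimensional persistence of an embedded graph under the height <x, v>."""
    h = [p[0] * v[0] + p[1] * v[1] for p in points]
    parent = list(range(len(points)))
    birth = h[:]  # birth height of each component's oldest vertex

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    pairs = []
    # An edge enters the sublevel filtration at the height of its upper endpoint.
    for a, b in sorted(edges, key=lambda e: max(h[e[0]], h[e[1]])):
        t = max(h[a], h[b])
        ra, rb = find(a), find(b)
        if ra == rb:
            continue  # the edge closes a cycle; no 0-dim class dies
        if birth[ra] > birth[rb]:
            ra, rb = rb, ra  # elder rule: the younger component dies
        pairs.append((birth[rb], t))
        parent[rb] = ra
    for r in {find(i) for i in range(len(points))}:
        pairs.append((birth[r], math.inf))  # essential component(s)
    return sorted(pairs)
```

Sweeping v over the circle of directions and recording how these pairs change is, in spirit, what a faithful discretization of the (A)PHT must capture with finitely many directions.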
Efficient Graph Reconstruction and Representation Using Augmented Persistence Diagrams
Persistent homology is a tool that can be employed to summarize the shape of data by quantifying homological features. When the data is an object in Euclidean space, the (augmented) persistent homology transform ((A)PHT) is a family of persistence diagrams, parameterized by directions in the ambient space. A recent advance in understanding the PHT used the framework of reconstruction in order to find a finite set of directions to faithfully represent the shape, a result that is of both theoretical and practical interest. In this paper, we improve upon this result and present a faster algorithm for graph -- and, more generally, one-skeleton -- reconstruction. The improvement comes in reconstructing the edges, where we use a radial binary (multi-)search. The binary search employed takes advantage of the fact that the edges can be ordered radially with respect to a reference plane, a feature unique to graphs.
Comment: This work originally appeared in the 2022 proceedings of the Canadian Conference on Computational Geometry (CCCG). We have updated the proof of Theorem 2 in Appendix A for clarity and correctness. We have also corrected and clarified Section 3.2, as previously it used slightly stricter general position assumptions than those given in Assumption.
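The radial ordering that makes a binary search possible can be illustrated in the plane: directions out of a common point can be sorted by angle, after which standard bisection locates a query direction in logarithmic time. This is an illustrative sketch with hypothetical helper names, not the paper's radial (multi-)search.

```python
import math
from bisect import bisect_left

def angle(v):
    # Angle of vector v in [0, 2*pi), measured from the positive x-axis.
    return math.atan2(v[1], v[0]) % (2 * math.pi)

def radial_order(vectors):
    # Sort direction vectors counterclockwise by angle.
    return sorted(vectors, key=angle)

def locate(sorted_vectors, query):
    # Binary search: index of the first stored direction at or past the query angle.
    angles = [angle(v) for v in sorted_vectors]
    return bisect_left(angles, angle(query)) % len(sorted_vectors)
```

Once edges admit such a total radial order, each membership test costs O(log n) comparisons instead of a linear scan, which is the source of the speedup the abstract describes.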
From Curves to Words and Back Again: Geometric Computation of Minimum-Area Homotopy
Let γ be a generic closed curve in the plane. Samuel Blank, in his 1967 Ph.D. thesis, determined if γ is self-overlapping by geometrically constructing a combinatorial word from γ. More recently, Zipei Nie, in an unpublished manuscript, computed the minimum homotopy area of γ by constructing a combinatorial word algebraically. We provide a unified framework for working with both words and determine the settings under which Blank's word and Nie's word are equivalent. Using this equivalence, we give a new geometric proof for the correctness of Nie's algorithm. Unlike previous work, our proof is constructive, which allows us to naturally compute the actual homotopy that realizes the minimum area. Furthermore, we contribute to the theory of self-overlapping curves by providing the first polynomial-time algorithm to compute a self-overlapping decomposition of any closed curve γ with minimum area.
Comment: 27 pages, 16 figures
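For the degenerate case of a simple (non-self-intersecting) closed curve, the minimum homotopy area is just the enclosed area, which for a polygonal curve is given by the shoelace formula; the combinatorial word machinery is needed precisely because self-overlapping and general curves are harder. A minimal sketch of that base quantity:

```python
def signed_area(polygon):
    # Shoelace formula: half the sum of cross products of consecutive vertices.
    # Positive for counterclockwise orientation, negative for clockwise.
    s = 0.0
    n = len(polygon)
    for i in range(n):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return s / 2.0
```

For a simple curve, |signed_area| is the minimum homotopy area; for a curve with self-intersections the area swept by a nullhomotopy can exceed any single enclosed region, which is what the algorithms above address.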